Conversation
purple4reina
left a comment
Question. How does LMI handle cold starts then? Does the customer ever see their invocations waiting on a cold start, or does it always do something like proactive init?
Question. Will you also be doing this for Python? I assume the universal languages will already be handled, since their cold start spans are created in the extension.
As with provisioned concurrency, or when a sandbox is proactively initialized, that's how the span would look on the trace; this change avoids that experience on LMI specifically.
Yes, and for those the change has already been made in the extension.
What does this PR do?
Omit creating cold start tracing on LMI
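A minimal sketch of the idea in Go, assuming hypothetical helpers `isLMI()` and `startColdStartSpan()` (these are illustrative names, not the actual identifiers or detection logic in this repo): the cold start span is only created for non-LMI sandboxes.

```go
package main

import "fmt"

// Hypothetical helpers; the real check and span call in this repo may differ.
func isLMI() bool         { return false } // e.g. detect LMI from the environment
func startColdStartSpan() { fmt.Println("cold start span created") }

// maybeTraceColdStart creates the cold start span only for non-LMI sandboxes.
// On LMI the sandbox can be initialized proactively, long before the first
// invocation, so the span would show a misleading gap on the trace.
func maybeTraceColdStart(isColdStart bool) {
	if !isColdStart || isLMI() {
		return
	}
	startColdStartSpan()
}

func main() {
	maybeTraceColdStart(true) // regular sandbox: span created; on LMI it is omitted
}
```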
Motivation
We want to provide a better experience when a new sandbox is initialized and the gap between invocation and init is quite long. See SVLS-8482.
Testing Guidelines
Unit tests
Additional Notes
Metrics are not affected, since we still want to know how many sandboxes are spun up.
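A hedged sketch of that split, using hypothetical stand-ins for the real metric and span calls: the cold start metric is emitted for every new sandbox, while the span is gated on LMI.

```go
package main

import "fmt"

// Hypothetical stand-ins for the actual metric and span calls.
func incrementColdStartMetric() { fmt.Println("cold start metric emitted") }
func startColdStartSpan()       { fmt.Println("cold start span created") }

// onColdStart always counts the new sandbox, but only traces it outside LMI.
func onColdStart(isLMI bool) {
	incrementColdStartMetric() // we still want to know how many sandboxes spin up
	if !isLMI {
		startColdStartSpan()
	}
}

func main() {
	onColdStart(true)  // LMI sandbox: metric only
	onColdStart(false) // regular sandbox: metric and span
}
```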
Types of Changes
Check all that apply